Supervised dimensionality reduction via sequential semidefinite programming

Authors

  • Chunhua Shen
  • Hongdong Li
  • Michael J. Brooks
Abstract

Many dimensionality reduction problems end up with a trace quotient formulation. Since it is difficult to directly solve the trace quotient problem, traditionally the trace quotient cost function is replaced by an approximation such that generalized eigenvalue decomposition can be applied. In contrast, we directly optimize the trace quotient in this work. It is reformulated as a quasi-linear semidefinite optimization problem, which can be solved globally and efficiently using standard off-the-shelf semidefinite programming solvers. This optimization strategy also allows one to enforce additional constraints (for example, sparseness constraints) on the projection matrix. We apply this optimization framework to a novel dimensionality reduction algorithm. The performance of the proposed algorithm is demonstrated in experiments on several UCI machine learning benchmark examples, USPS handwritten digits, as well as ORL and Yale face data.
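For context, the conventional approximation that the abstract contrasts against replaces the trace quotient max tr(WᵀS_b W) / tr(WᵀS_w W) with a generalized eigenvalue problem S_b w = λ S_w w. A minimal sketch of that baseline (not the paper's SDP method), assuming hypothetical between-class and within-class scatter matrices `S_b` and `S_w`:

```python
import numpy as np

def gevd_projection(S_b, S_w, d):
    """Approximate the trace quotient by the generalized eigenvalue
    problem S_b w = lambda * S_w w, keeping the top-d eigenvectors."""
    # Whiten with the Cholesky factor of S_w, then eigendecompose
    # the (symmetric) whitened between-class scatter.
    L = np.linalg.cholesky(S_w)
    L_inv = np.linalg.inv(L)
    M = L_inv @ S_b @ L_inv.T          # symmetric after whitening
    _, U = np.linalg.eigh(M)           # eigenvalues in ascending order
    return L_inv.T @ U[:, -d:]         # map back: w = L^{-T} u

# Toy positive-definite scatter matrices, for illustration only.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
B = rng.standard_normal((5, 5))
S_b = A @ A.T + np.eye(5)
S_w = B @ B.T + np.eye(5)
W = gevd_projection(S_b, S_w, 2)
print(W.shape)  # (5, 2)
```

The point of the paper is that this eigendecomposition only approximates the trace quotient, whereas the proposed quasi-linear SDP reformulation optimizes it directly.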


Similar articles

Nonlinear Dimensionality Reduction by Semidefinite Programming and Kernel Matrix Factorization

We describe an algorithm for nonlinear dimensionality reduction based on semidefinite programming and kernel matrix factorization. The algorithm learns a kernel matrix for high dimensional data that lies on or near a low dimensional manifold. In earlier work, the kernel matrix was learned by maximizing the variance in feature space while preserving the distances and angles between nearest neigh...


Seminar Report Dimensionality Reduction and its Application to Semi-supervised Learning

The problem of finding a low-dimensional structure amongst inputs that have been sampled from a high-dimensional manifold is known as dimensionality reduction. When viewed from a machine learning perspective, it can be directly compared with feature selection. Another way of looking at dimensionality reduction is as a preprocessing mechanism wherein the data of high dimensionality can be process...


Fast Graph Laplacian Regularized Kernel Learning via Semidefinite-Quadratic-Linear Programming

Kernel learning is a powerful framework for nonlinear data modeling. Using the kernel trick, a number of problems have been formulated as semidefinite programs (SDPs). These include Maximum Variance Unfolding (MVU) (Weinberger et al., 2004) in nonlinear dimensionality reduction, and Pairwise Constraint Propagation (PCP) (Li et al., 2008) in constrained clustering. Although in theory SDPs can be...


Large-Scale Manifold Learning by Semidefinite Facial Reduction

The problem of nonlinear dimensionality reduction is often formulated as a semidefinite programming (SDP) problem. However, only SDP problems of limited size can be solved directly using current SDP solvers. To overcome this difficulty, we propose a novel SDP formulation for dimensionality reduction based on semidefinite facial reduction that significantly reduces the number of variabl...


Convex Formulations for Fair Principal Component Analysis

Though there is a growing body of literature on fairness for supervised learning, the problem of incorporating fairness into unsupervised learning has been less well-studied. This paper studies fairness in the context of principal component analysis (PCA). We first present a definition of fairness for dimensionality reduction, and our definition can be interpreted as saying that a reduction is ...



Journal:
  • Pattern Recognition

Volume 41  Issue 

Pages  -

Publication date: 2008